Minimax Rate Optimal Adaptive Nearest Neighbor Classification and Regression

Authors

Abstract

The k Nearest Neighbor (kNN) method is a simple and popular statistical method for classification and regression. For both classification and regression problems, existing works have shown that, if the distribution of the feature vector has bounded support and its probability density function is bounded away from zero on that support, the convergence rate of the standard kNN method, in which k is the same for all test samples, is minimax optimal. On the contrary, if the support is unbounded, we show that there is a gap between the rate achieved by the standard kNN method and the minimax lower bound. To close this gap, we propose an adaptive kNN method, in which different k is selected for different test samples. Our selection rule does not require precise knowledge of the underlying distribution of the features. The proposed adaptive method significantly outperforms the standard one. We characterize its convergence rate and show that it matches the minimax lower bound.
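The idea of selecting a different k for each test sample can be sketched as follows. This is an illustrative sketch only, not the paper's exact selection rule: the density proxy (count of training points within a fixed radius h) and the constants A, q, and h are assumptions introduced here for demonstration.

```python
import numpy as np

def adaptive_knn_regress(X_train, y_train, X_test, A=2.0, q=0.6, h=0.5):
    """Adaptive kNN regression sketch: for each test point, choose k
    in proportion to a local density proxy (the number of training
    points within radius h), instead of one global k for all queries.
    A, q, h are illustrative tuning constants, not the paper's values."""
    preds = []
    for x in X_test:
        d = np.linalg.norm(X_train - x, axis=1)   # distances to training set
        n_local = np.sum(d <= h)                  # local sample count (density proxy)
        k = max(1, int(A * (n_local + 1) ** q))   # density-adaptive k
        k = min(k, len(y_train))                  # never exceed sample size
        idx = np.argsort(d)[:k]                   # k nearest neighbors
        preds.append(y_train[idx].mean())         # kNN regression estimate
    return np.array(preds)
```

In sparse regions (e.g. the tails of an unbounded distribution) the count n_local is small, so a smaller k is used and the estimator averages over a tighter neighborhood, which is the intuition behind closing the gap to the minimax bound.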

Similar works

Discriminant Adaptive Nearest Neighbor Classification and Regression

Robert Tibshirani, Department of Statistics, University of Toronto (tibs@utstat.toronto.edu). Nearest neighbor classification expects the class conditional probabilities to be locally constant, and suffers from bias in high dimensions. We propose a locally adaptive form of nearest neighbor classification to try to finesse this curse of dimensionality. We use a local linear discriminant analysis to e...


Adaptive Metric nearest Neighbor Classification

Nearest neighbor classification assumes locally constant class conditional probabilities. This assumption becomes invalid in high dimensions with finite samples due to the curse of dimensionality. Severe bias can be introduced under these conditions when using the nearest neighbor rule. We propose a locally adaptive nearest neighbor classification method to try to minimize bias. We use a Chi-squ...


Discriminant Adaptive Nearest Neighbor Classification

Nearest neighbor classification expects the class conditional probabilities to be locally constant, and suffers from bias in high dimensions. We propose a locally adaptive form of nearest neighbor classification to try to finesse this curse of dimensionality. We use a local linear discriminant analysis to estimate an effective metric for computing neighborhoods. We determine the local decision b...


Adaptive Kernel Metric Nearest Neighbor Classification

Nearest neighbor classification assumes locally constant class conditional probabilities. This assumption becomes invalid in high dimensions due to the curse of dimensionality. Severe bias can be introduced under these conditions when using the nearest neighbor rule. We propose an adaptive nearest neighbor classification method to try to minimize bias. We use quasiconformal transformed kernels t...


Locally Adaptive Metric Nearest-Neighbor Classification

Nearest-neighbor classification assumes locally constant class conditional probabilities. This assumption becomes invalid in high dimensions with finite samples due to the curse of dimensionality. Severe bias can be introduced under these conditions when using the nearest-neighbor rule. We propose a locally adaptive nearest-neighbor classification method to try to minimize bias. We use a Chi-s...



Journal

Journal title: IEEE Transactions on Information Theory

Year: 2021

ISSN: 0018-9448, 1557-9654

DOI: https://doi.org/10.1109/tit.2021.3062078